-
Aldrich, Jonathan; Silva, Alexandra (Ed.)
Many important security properties can be formulated in terms of flows of tainted data, and improved taint analysis tools to prevent such flows are critically needed. Most existing taint analyses use whole-program static analysis, leading to scalability challenges. Type-based checking is a promising alternative, as it enables modular and incremental checking for fast performance. However, type-based approaches have not been widely adopted in practice, due to challenges with false positives and with annotating existing codebases. In this paper, we present a new approach to type-based checking of taint properties that addresses these challenges, based on two key techniques. First, we present a new type-based tainting checker with significantly reduced false positives, achieved via more practical handling of third-party libraries and other language constructs. Second, we present a novel technique to automatically infer tainting type qualifiers for existing code. Our technique supports inference of generic type argument annotations, which is crucial for tainting properties. We implemented our techniques in a tool, TaintTyper, and evaluated it on real-world benchmarks. TaintTyper exceeds the recall of a state-of-the-art whole-program taint analyzer, with comparable precision and 2.93X-22.9X faster checking time. Further, TaintTyper infers annotations comparable to those written by hand, suitable for insertion into source code. TaintTyper is a promising new approach to efficient and practical taint checking.
Free, publicly-accessible full text available June 25, 2026
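The abstract above does not spell out TaintTyper's annotation syntax; the sketch below illustrates the general idea of tainting type qualifiers using the Checker Framework's Tainting Checker annotations (@Tainted/@Untainted), including a qualifier on a generic type argument, the kind of annotation the paper's inference targets. The class, method names, and sanitizer are hypothetical.

```java
// Illustrative sketch only: qualifier names follow the Checker Framework's
// Tainting Checker (checker-qual must be on the classpath); TaintTyper's own
// qualifiers may differ.
import java.util.ArrayList;
import java.util.List;

import org.checkerframework.checker.tainting.qual.Tainted;
import org.checkerframework.checker.tainting.qual.Untainted;

public class TaintExample {

    // Source: user input is treated as tainted.
    static @Tainted String readUserInput() {
        return System.console() != null ? System.console().readLine() : "";
    }

    // Sink: only untainted data may reach the query.
    static void runQuery(@Untainted String sql) {
        /* execute sql ... */
    }

    // Sanitizer: the one place a tainted value may become untainted.
    @SuppressWarnings("tainting") // confine the unchecked conversion here
    static @Untainted String sanitize(@Tainted String s) {
        return s.replaceAll("[^A-Za-z0-9_]", "");
    }

    public static void main(String[] args) {
        @Tainted String input = readUserInput();
        // runQuery(input);        // rejected: tainted value flows into an untainted sink
        runQuery(sanitize(input)); // accepted

        // Qualifiers also appear on generic type arguments; inferring these
        // is what the abstract calls out as crucial for tainting properties.
        List<@Untainted String> safeIds = new ArrayList<>();
        safeIds.add(sanitize(input));
    }
}
```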
-
Null-pointer exceptions are a serious problem for Java, and researchers have developed type-based nullness checking tools to prevent them. These tools, however, have a downside: they require developers to write nullability annotations, which is time-consuming and hinders adoption. Researchers have therefore proposed nullability annotation inference tools, whose goal is to (partially) automate the task of annotating a program for nullability. However, prior works rely on differing theories of what makes a set of nullability annotations good, making it challenging to compare their effectiveness. In this work, we identify a systematic bias in some prior experimental evaluations of these tools: the use of “type reconstruction” experiments to see if a tool can recover erased developer-written annotations. We show that developers make semantic code changes while adding annotations to facilitate typechecking, leading such experiments to overestimate the effectiveness of inference tools on never-annotated code. We propose a new definition of the “best” inferred annotations for a program that avoids this bias, based on a systematic exploration of the design space. With this new definition, we perform the first head-to-head comparison of three extant nullability inference tools. Our evaluation shows the complementary strengths of the tools and remaining weaknesses that could be addressed in future work.
Free, publicly-accessible full text available June 19, 2026
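As a concrete illustration of the bias described above, the hypothetical pair of methods below shows how a developer, while annotating, may also make a semantic change (adding a default value) so that a stronger @NonNull type checks. Erasing the annotations from the second version and asking an inference tool to recover them measures the tool on already-adapted code rather than on the never-annotated code it would face in practice. Annotation names follow the Checker Framework's Nullness Checker; the example is not taken from the paper.

```java
// Hypothetical example; annotations from org.checkerframework.checker.nullness.qual.
import java.util.Map;

import org.checkerframework.checker.nullness.qual.NonNull;
import org.checkerframework.checker.nullness.qual.Nullable;

public class NullnessExample {

    // Never-annotated original behavior: may return null, so a sound inference
    // tool can only infer a @Nullable return type here.
    static @Nullable String lookupOriginal(Map<String, String> config, String key) {
        return config.get(key); // null when the key is absent
    }

    // What the developer actually committed while annotating: a semantic change
    // (a default value) that lets the stronger @NonNull return type typecheck.
    static @NonNull String lookupAnnotated(Map<String, String> config, String key) {
        String value = config.get(key);
        return value != null ? value : ""; // added during annotation
    }
}
```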
-
Quantum algorithms will likely play a key role in future high-performance computing (HPC) environments. These algorithms are typically expressed as quantum circuits composed of arbitrary gates or as unitary matrices. Executing them on physical devices, however, requires translation to device-compatible circuits, in a process called quantum compilation or circuit synthesis, since these devices support a limited number of native gates. Moreover, these devices typically have specific qubit topologies, which constrain how and where gates can be applied. Consequently, logical qubits in input circuits and unitaries may need to be mapped to and routed between physical qubits. Furthermore, current Noisy Intermediate-Scale Quantum (NISQ) devices present additional constraints. They are vulnerable to errors during gate application, and their short decoherence times lead to qubits rapidly succumbing to accumulated noise and possibly corrupting computations. Therefore, circuits synthesized for NISQ devices need to minimize gate count and execution time. The problem of synthesizing device-compatible circuits while optimizing for low gate count and short execution time can be shown to be computationally intractable using analytical methods. Therefore, interest has grown in heuristics-based synthesis techniques, which produce approximations of the desired algorithm while optimizing depth and gate count. In this work, we investigate using genetic algorithms (GA), a proven gradient-free optimization technique based on natural selection, for circuit synthesis. In particular, we formulate the quantum synthesis problem as a multi-objective optimization (MOO) problem, with the objectives of minimizing the approximation error, the number of multi-qubit gates, and the circuit depth. We also employ fuzzy logic for runtime parameter adaptation of the GA to enhance search efficiency and solution quality in our proposed method.
Free, publicly-accessible full text available April 1, 2026
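A minimal sketch of the kind of genetic search described above follows, assuming a weighted-sum scalarization of the three objectives (approximation error, multi-qubit gate count, depth); the paper formulates a true multi-objective optimization and adds fuzzy-logic parameter adaptation, neither of which is reproduced here. The gate set, weights, and error function are hypothetical, and the approximation error is a deterministic placeholder standing in for the actual distance to the target unitary.

```java
// Toy GA loop for circuit synthesis; objectives are combined with fixed weights.
import java.util.ArrayList;
import java.util.Comparator;
import java.util.List;
import java.util.Random;

public class CircuitGA {
    enum Gate { RX, RZ, CNOT }          // assumed native gate set
    record Op(Gate gate, int qubit) {}  // CNOT acts on (qubit, qubit+1) for simplicity

    static final int QUBITS = 3, MAX_LEN = 20, POP = 50, GENERATIONS = 200;
    static final Random RNG = new Random(42);

    // Placeholder for the real objective: distance between the circuit's unitary
    // and the target (e.g. 1 - fidelity). Deterministic so selection is stable.
    static double approximationError(List<Op> circuit) {
        return Math.abs(circuit.hashCode() % 1000) / 1000.0;
    }

    // Weighted-sum fitness over the three objectives (lower is better).
    static double fitness(List<Op> c) {
        long multiQubit = c.stream().filter(op -> op.gate() == Gate.CNOT).count();
        int depth = c.size(); // crude proxy; real depth accounts for parallel gates
        return 10.0 * approximationError(c) + 0.5 * multiQubit + 0.1 * depth;
    }

    static List<Op> randomCircuit() {
        List<Op> c = new ArrayList<>();
        int len = 1 + RNG.nextInt(MAX_LEN);
        for (int i = 0; i < len; i++) {
            Gate g = Gate.values()[RNG.nextInt(Gate.values().length)];
            c.add(new Op(g, RNG.nextInt(g == Gate.CNOT ? QUBITS - 1 : QUBITS)));
        }
        return c;
    }

    // One-point crossover followed by a single random gate mutation.
    static List<Op> offspring(List<Op> a, List<Op> b) {
        List<Op> child = new ArrayList<>(a.subList(0, RNG.nextInt(a.size()) + 1));
        child.addAll(b.subList(RNG.nextInt(b.size()), b.size()));
        if (child.size() > MAX_LEN) child = new ArrayList<>(child.subList(0, MAX_LEN));
        child.set(RNG.nextInt(child.size()), randomCircuit().get(0));
        return child;
    }

    public static void main(String[] args) {
        List<List<Op>> pop = new ArrayList<>();
        for (int i = 0; i < POP; i++) pop.add(randomCircuit());
        for (int gen = 0; gen < GENERATIONS; gen++) {
            pop.sort(Comparator.comparingDouble(CircuitGA::fitness));
            List<List<Op>> next = new ArrayList<>(pop.subList(0, POP / 5)); // elitism
            while (next.size() < POP) {
                List<Op> p1 = pop.get(RNG.nextInt(POP / 2)); // bias toward the fitter half
                List<Op> p2 = pop.get(RNG.nextInt(POP / 2));
                next.add(offspring(p1, p2));
            }
            pop = next;
        }
        pop.sort(Comparator.comparingDouble(CircuitGA::fitness));
        System.out.println("best (toy) fitness: " + fitness(pop.get(0)));
    }
}
```

A real implementation would replace the weighted sum with Pareto-based selection (e.g. non-dominated sorting) so that no fixed trade-off between error, gate count, and depth has to be chosen up front.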
